Corelab Seminar
2010-2011
John Panageas
Differential Privacy
Abstract.
Search engines, hospitals, and other organizations possess huge amounts of sensitive personal information that must be kept private.
In our setting, suppose we hold some private data and we want an algorithm that returns a "good" solution with respect to that data. Differential privacy is the constraint that a small change in the private data causes only a small change in the probability distribution of
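the algorithm's output. To make this precise (the abstract states the condition only informally; this is the standard definition): a randomized algorithm $M$ is $\varepsilon$-differentially private if for every pair of inputs $D, D'$ differing in a single entry and every set $S$ of outputs,

    \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].

For small $\varepsilon$ the two output distributions are nearly indistinguishable, so the presence or absence of any single individual's data barely affects the outcome.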
This constraint is the key to achieving approximate, and sometimes even exact, truthfulness in mechanism design. The idea is simple: if agent i misreports her utility function, the distribution over outcomes barely changes, so she gains little or nothing by deviating.
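A one-line version of this argument, under the additional assumption (not stated in the abstract) that utilities lie in $[0, 1]$: if the mechanism $M$ is $\varepsilon$-differentially private, then for any agent $i$ with true type $t_i$ and any misreport $t_i'$,

    \mathbb{E}\bigl[u_i(M(t_i', t_{-i}))\bigr] \;\le\; e^{\varepsilon} \cdot \mathbb{E}\bigl[u_i(M(t_i, t_{-i}))\bigr],

so deviating raises agent i's expected utility by at most a factor $e^{\varepsilon} \approx 1 + \varepsilon$, i.e., by an additive gain of at most roughly $\varepsilon$; truth-telling is thus an $\varepsilon$-approximately dominant strategy. This is essentially the observation of McSherry and Talwar ("Mechanism Design via Differential Privacy", FOCS 2007).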